A Statistical Recurrent Model on the Manifold of Symmetric Positive Definite Matrices

Neural Information Processing Systems

In a number of disciplines, the data (e.g., graphs, manifolds) to be analyzed are non-Euclidean in nature. Geometric deep learning corresponds to techniques that generalize deep neural network models to such non-Euclidean spaces. Several recent papers have shown how convolutional neural networks (CNNs) can be extended to learn with graph-based data. In this work, we study the setting where the data (or measurements) are ordered, longitudinal or temporal in nature and live on a Riemannian manifold -- this setting is common in a variety of problems in statistical machine learning, vision and medical imaging. We show how statistical recurrent network models can be defined in such spaces. We give an efficient algorithm and conduct a rigorous analysis of its statistical properties. We perform extensive numerical experiments demonstrating competitive performance with state-of-the-art methods but with significantly fewer parameters. We also show applications to a statistical analysis task in brain imaging, a regime where deep neural network models have only been utilized in limited ways.


Reviews: A Statistical Recurrent Model on the Manifold of Symmetric Positive Definite Matrices

Neural Information Processing Systems

The submission #5332, entitled "Statistical Recurrent Models on Manifold valued Data", presents a framework for fitting recurrent neural networks on SPD matrices. Much of the effort is spent on deriving the framework for these computations. Particular attention is paid to the estimation of the Fréchet mean for this type of data, in order to obtain a fast yet reliable estimator. All of these data are very low-dimensional, so the reported algorithmic optimization is unlikely to be useful in practical settings. The experiments show that the proposed approach is accurate on the problems considered and requires fewer iterations for convergence.
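The Fréchet mean mentioned in the review generalizes the arithmetic mean to a Riemannian manifold: it is the point minimizing the sum of squared geodesic distances to the samples. The paper develops its own fast recursive estimator; as a simpler illustration, the sketch below computes the closed-form Fréchet mean of SPD matrices under the log-Euclidean metric (not the paper's estimator), where the mean is the matrix exponential of the average of the matrix logarithms. Function names here are illustrative, not from the paper.

```python
import numpy as np

def spd_log(S):
    # Matrix logarithm of an SPD matrix via eigendecomposition:
    # log(S) = V diag(log w) V^T for S = V diag(w) V^T.
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.log(w)) @ V.T

def spd_exp(S):
    # Matrix exponential of a symmetric matrix via eigendecomposition.
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.exp(w)) @ V.T

def log_euclidean_mean(mats):
    # Closed-form Fréchet mean under the log-Euclidean metric:
    # exponentiate the arithmetic mean of the matrix logs.
    return spd_exp(np.mean([spd_log(S) for S in mats], axis=0))

# Example: two commuting SPD matrices, where the log-Euclidean mean
# reduces to the elementwise geometric mean of the eigenvalues.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
B = np.array([[8.0, 0.0], [0.0, 1.0]])
M = log_euclidean_mean([A, B])  # ≈ [[4, 0], [0, 1]], since sqrt(2 * 8) = 4
```

Under the affine-invariant metric used in much of the SPD literature there is no closed form, and the mean is found iteratively; the log-Euclidean mean above is a common fast surrogate.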


A Statistical Recurrent Model on the Manifold of Symmetric Positive Definite Matrices

Chakraborty, Rudrasis, Yang, Chun-Hao, Zhen, Xingjian, Banerjee, Monami, Archer, Derek, Vaillancourt, David, Singh, Vikas, Vemuri, Baba

Neural Information Processing Systems
